ChatGPT
(We need to talk)
Since its public release at the end of 2022, ChatGPT – the artificial intelligence chatbot developed by OpenAI – has experienced rapid growth and widespread adoption. Its role in education, however, remains a topic of contention. While some view it as a tool to enhance learning and reduce teacher workload, others see it as a threat to integrity which opens the door to cheating and plagiarism.
In this Q&A, two researchers from the Faculty of Education offer a more nuanced perspective on the opportunities, challenges and possibilities of ChatGPT. Dr Vaughan Connolly is a teacher and researcher with interests in the role of technology in education and teacher workload. Dr Steve Watson is co-convener of the Faculty’s Knowledge, Power and Politics research cluster, and has used social systems theory to explore the meaning and communication of ChatGPT.
Q: Why does ChatGPT matter for educators?
Vaughan Connolly: ChatGPT represents a tipping point in the development of AI and we teachers ignore it at our peril. For educators, it’s going to be as transformational as Google was in 1998, and requires a serious conversation about the benefits, challenges and implications for schools and learners. The future will be changed by it indelibly. Educators have to start engaging with it in a meaningful way.
Steve Watson: I’d characterise ChatGPT as a hugely powerful assistive technology. For producing text, throwing around ideas and playing with them, it’s transformative. Historically, most communication innovations have tried to improve the exchange of meaning and avoid misunderstanding. Education is, in many ways, also concerned with this. I’d urge educators to start thinking about what ChatGPT can and cannot do from that perspective.
Q: Much of the debate in education so far has been about whether ChatGPT should be banned or embraced. Where do you stand?
SW: We have to move beyond thinking purely in terms of dystopianism or boosterism; ChatGPT brings both new opportunities and new complexity. So far, it’s mostly been portrayed as a tool for content creation. In fact, generating content is probably its weakest function. Where it excels is at manipulating structure and form. Everyone seems to be worrying that ChatGPT can produce an essay or coursework that will pass exams, but what that means is that educators need to adapt and say, “OK, what can we do with that?”
VC: With its rapid growth and especially the leap from version 3 to 4, the conversation about banning it is becoming irrelevant. Transformations will accelerate as these systems train themselves, and indeed this is already happening. The question is not whether to use ChatGPT in schools, but how to do so safely, effectively and appropriately. Schools need to take the initiative and figure that out, or risk putting themselves and their students at a disadvantage. I really think this question is very urgent for the sector. I’m worried that the same urgency is not characterising our response.
"It could shift the focus of education towards critical thinking and big questions."
Q: What are some ways in which ChatGPT might potentially change the way people teach and learn in universities and/or schools?
SW: My problem at school was that I had all the ideas about what I wanted to say, but didn’t always know how to express them in the right form. This technology can help students present ideas in a clear and organised manner and in the right form, allowing teachers to focus on the ideas themselves.
In the process, it could shift the focus of education towards critical thinking and big questions. Because it can also summarise an argument, knowledge and concepts, there is real potential to help with formative assessment, especially in situations where teachers have limited time.
VC: I have used ChatGPT in a GCSE computer science class to prompt students to verify claims, seek further detail and challenge information. The conversation was really edifying. I’ve also tested it as a tool for homework and revision, taking the perspective of students in different subjects and phases. Posing as a Key Stage 4 History student, for example, I had a very informative chat about 19th century cholera. One can envisage situations in which it might really help students with revision, checking homework answers, or refining an essay. It has also helped postgraduates for whom English is a second language clarify what they mean in their writing. They can give it a paragraph and ask it to rewrite it in an academic style. The results have been impressive, and it reduces the cognitive burden of translating for these students by allowing them to focus on the content, rather than the technical aspects of their writing in a less familiar language.
Q: How would you characterise the main challenges that ChatGPT raises for the education sector?
VC: My number one concern is that we don’t ignore it. It’s already being used very widely and the cost of replicating it is much lower than that of inventing it.
There have already been some negative consequences – for example, recent issues with My AI, the Snapchat-based chatbot, coaching users in harmful behaviours. The ubiquity of the technology means that educators need to be aware of how to use it, of the associated dangers, and of how to encourage safe use.
SW: An innovation like this is always a double-edged sword: we can’t just treat it as heralding the arrival of a new utopia. In many ways it takes us back to the old Douglas Adams idea of Deep Thought: we’re very attracted by the prospect of something acting as a source of ultimate intelligence and informational knowledge, but this is not a trustworthy, independent source of information. You can’t take the human operator out of ChatGPT. The challenge for educators is to probe its capacities and limitations and make it work for them.
"The technology needs to be trained on diverse datasets, from different countries and in different languages. If I was working for a large education organisation, I’d be putting pressure on the developers."
Q: How should educators handle the potential for ChatGPT to reproduce certain social biases and inequalities?
SW: This takes us back to the potential that people might use ChatGPT, wrongly, as a source of content and knowledge. We’re already seeing cases where people are critiquing this, posting on Twitter that they asked it for a list of 10 famous philosophers, for example, and it gave them a list of 10 white men.
It’s important to stress that this is an assistive tool: its primary value is that it can transform the structure and form of text while maintaining and stabilising meaning. Educators should be mindful of this and highlight its limitations as a source of knowledge.
VC: The first thing I’d observe is that the same issues affect many digital resources, like Wikipedia and Google searches. The creators of this technology need to make sure that it is being trained on diverse datasets, from different countries and in different languages. If I was working for a large education organisation, I’d be putting pressure on the developers to do this. Teachers, meanwhile, have a responsibility to train students to use the technology properly. I frequently explain to students, for example, that a Google search on a particular subject might carry inherent biases in terms of gender, culture and race. These considerations are part of any good education programme that teaches people to use technology wisely.
Q: Are there ways in which we should be restricting the use of ChatGPT in education?
SW: There’s definitely an issue with regulation and we have to be respectful of the need for universities and schools to protect academic and educational integrity. That’s why some universities have taken the view that if students use ChatGPT to generate content, it’s misconduct. The problem is that in our efforts to preserve integrity we might become unresponsive to change that is happening anyway as the technology is adopted on a large scale. If we over-regulate ChatGPT, that gap will widen and the institutional position will be irrelevant.
VC: Rather than restricting the use of ChatGPT in education, bodies such as the Department for Education could hold the people who provide these services to account so that the potential for harmful use is minimised. There may be a place for this in legislation such as the Online Safety Bill, for example. Digital skills are part of the curriculum already, so the Government should be considering how to ensure that schools teach students to be safe and effective critical users of this technology – just as they are already supported to stay safe online. I’d like to see this sort of critical training becoming part of not just PSHE and Computer Science, but also other curriculum areas, ideally led by experts with a deep understanding of such technologies.
"This is an iPhone moment."
Q: At a local level, how can universities and schools respond to ChatGPT’s emergence in a meaningful way?
VC: This is an iPhone moment and institutions should bring together staff from different departments, and both academic and non-academic perspectives, to explore ChatGPT and put it through its paces. We have a responsibility to ask, “What is this, and how could we use it?” If you teach maths, could it help with revision? If you’re a computer science teacher, how can it help students to code? From a workload perspective, can it speed up how you prepare for lessons? We need a ground-up approach to working this out, but there’s also space for organisations like the Chartered College of Teaching, for example, to guide that process and enable schools to share their experiences and findings.
SW: The thing to remember is that we can’t just import this technology and expect it to work. We need an iterative process of design, development and research in situ. There will be lots of people – especially students – who are miles ahead with this already. If we can harness that dynamism by getting groups of people to collaborate and test it, the guidelines and structures we need to place around the technology will start to emerge. There is a huge opportunity for centres of educational research like our own Faculty to collaborate with schools on this.
Q: Is education set up to cope with AI?
VC: I really don’t think schools can wait for the Government to tell them how to handle this. A model of distributed leadership is necessary, with staff empowered to come together and discuss the use of ChatGPT. School leaders need to create time and space to have the discussion. It’s also important to recognise that students have a more nuanced appreciation of the technology than we sometimes give them credit for. Those I’ve spoken to are concerned about the need to avoid using ChatGPT as a shortcut in their education, lest this undermine their learning. It’s important that students are part of the conversation.
SW: The traditional approach of regulating out any potential threats to the way educational organisations operate won’t work in this case. That’s why we need to see collaboration between executives and practitioners to determine appropriate use. ChatGPT is showing us that we can’t always control, predict and calculate our way through risks to education on a top-down basis. Our normal decision-making hierarchies need to be relaxed so that different parts of the education system can experiment with it and refine their usage of it – while remaining in constant dialogue with leaders and executives.